Viewpoint Invariant Collective Activity Recognition with Relative Action Context

Authors

  • Takuhiro Kaneko
  • Masamichi Shimosaka
  • Shigeyuki Odashima
  • Rui Fukui
  • Tomomasa Sato
Abstract

This paper presents an approach to collective activity recognition. Collective activities are activities performed by multiple persons, such as queueing in a line or talking together. To recognize them, the action context (AC) descriptor [1] encodes the “apparent” relation between people (e.g. a group crossing and facing “right”); however, this representation is sensitive to viewpoint changes. We instead propose a novel feature representation, the relative action context (RAC) descriptor, which encodes the “relative” relation (e.g. a group crossing and facing the “same” direction). This representation is viewpoint invariant and complementary to AC; hence we employ a simple combinational classifier that exploits both. The paper also introduces two methods to further improve performance. First, to make the contexts robust to diverse situations, we apply post-processing to them. Second, to reduce local classification failures, we regularize the classification with fully connected CRFs. Experimental results show that our method is applicable to a variety of scenes and outperforms state-of-the-art methods.
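
The intuition behind the relative encoding can be illustrated with a small sketch. The snippet below is a hypothetical simplification, not the authors' code: the full RAC descriptor in the paper is richer, while this only captures the relative-orientation idea. It bins neighbours' facing directions relative to the focal person's own direction, so rotating the whole scene (a viewpoint change) leaves the encoding unchanged.

```python
import numpy as np

def relative_context(focal_angle, neighbor_angles, n_bins=8):
    """Histogram of neighbours' facing directions measured relative to the
    focal person's own direction (hypothetical simplification of RAC)."""
    # Express each neighbour's orientation relative to the focal person.
    rel = (np.asarray(neighbor_angles) - focal_angle) % (2 * np.pi)
    # Quantize the relative angles into n_bins orientation bins.
    bins = np.floor(rel / (2 * np.pi / n_bins)).astype(int)
    hist = np.bincount(bins, minlength=n_bins).astype(float)
    return hist / max(hist.sum(), 1.0)

# Two people walking side by side, both facing "right" (0 rad) ...
h1 = relative_context(0.0, [0.05])
# ... and the same pair seen from a camera rotated by 90 degrees.
h2 = relative_context(np.pi / 2, [np.pi / 2 + 0.05])
assert np.allclose(h1, h2)  # the relative encoding is viewpoint invariant
```

An "apparent" encoding such as AC would bin the absolute angles instead, so the two views above would produce different features; the relative encoding discards exactly that viewpoint-dependent information, which is why the paper treats the two descriptors as complementary.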


Similar articles

Viewpoint Manifolds for Action Recognition

Action recognition from video is a problem that has many important applications to human motion analysis. In real-world settings, the viewpoint of the camera cannot always be fixed relative to the subject, so view-invariant action recognition methods are needed. Previous view-invariant methods use multiple cameras in both the training and testing phases of action recognition or require storing ma...


A fast, invariant representation for human action in the visual system.

Humans can effortlessly recognize others' actions in the presence of complex transformations, such as changes in viewpoint. Several studies have located the regions in the brain involved in invariant action recognition; however, the underlying neural computations remain poorly understood. We use magnetoencephalography decoding and a data set of well-controlled, naturalistic videos of five actio...


Fast, invariant representation for human action in the visual system

The ability to recognize the actions of others from visual input is essential to humans’ daily lives. The neural computations underlying action recognition, however, are still poorly understood. We use magnetoencephalography (MEG) decoding and a computational model to study action recognition from a novel dataset of well-controlled, naturalistic videos of five actions (run, walk, jump, eat, dri...


Toward a Real Time View-invariant 3D Action Recognition

In this paper we propose a novel human action recognition method, robust to viewpoint variation, which combines skeleton- and depth-based action recognition approaches. To this end, we first build several base classifiers to independently predict the action performed by a subject. Then, two efficient combination strategies, that take into account skeleton accuracy and human body orientation,...


Invariant recognition drives neural representations of action sequences

Recognizing the actions of others from visual stimuli is a crucial aspect of human perception that allows individuals to respond to social cues. Humans are able to discriminate between similar actions despite transformations, like changes in viewpoint or actor, that substantially alter the visual appearance of a scene. This ability to generalize across complex transformations is a hallmark of h...



Journal title:

Volume   Issue

Pages  -

Publication date: 2012